Pointed letters from Congress have been sent to tech executives, including Apple's Tim Cook, voicing concerns over the prevalence of deepfake non-consensual intimate images.
The letters stem from earlier reports of nude deepfakes being created with dual-use apps. Advertisements promoting face-swapping apps popped up all over social media, and users were turning to them to place faces into nude or pornographic images.
According to a new report from 404 Media, the US Congress is acting on these reports, asking tech companies how they plan to stop non-consensual intimate images from being generated on their platforms. Letters were sent to Apple, Alphabet, Microsoft, Meta, ByteDance, Snap, and X.
The letter to Apple specifically calls out the company for letting such dual-use apps slip past its App Review Guidelines. It asks what Apple plans to do to curb the spread of deepfake pornography and cites the TAKE IT DOWN Act.
The following questions were asked of Tim Cook:
- What plans are in place to proactively address the proliferation of deepfake pornography on your platform, and what is the timeline of deployment for those measures?
- What individuals or stakeholders, if any, are included in developing these plans?
- What is the process after a report is made and what kind of oversight exists to ensure that these reports are addressed in a timely manner?
- What is the process for determining if an application needs to be removed from your store?
- What, if any, remedy is available to users who report that their image was included non-consensually in a deepfake?
Apple acts as the steward of the App Store, and in doing so, takes the blame whenever something that shouldn't be there makes it in. Instances like deepfake tools or children's apps that turn into casinos are used as fuel for the argument that Apple shouldn't be the gatekeeper of its app platform.
The apps identified in previous reports, which Apple has since removed, showed clear signs of potential abuse. For example, users could source videos from Pornhub for face swapping.
Apple Intelligence can't generate nude or pornographic images, and Apple has stopped Sign in with Apple from working on deepfake websites. However, that's just the bare minimum of what Apple can and should be doing to combat deepfakes.
As the letter from Congress suggests, Apple needs to take steps to ensure that dual-use apps can't get through review. At a minimum, extra precautions should be taken with apps that promise AI image and video manipulation, especially those that offer face-swapping capabilities.
6 Comments
What’s Congress doing?
Are they extending laws to make the use of deepfakes as unlawful in political advertising as it is in commercial advertising?
Is Congress asking Adobe and the other photo manipulation app developers what they are doing to curb the use of their technology for the same purpose, which has been going on since photo manipulation was a thing?
So that Boebert episode never really happened?
I'm not American, nor a lawyer, but doesn't the First Amendment say "Congress shall [not] abridge freedom of speech"? When they put pressure on any company to "proactively address the proliferation of deepfake pornography" isn't that a direct attempt to make companies abridge people's freedom of speech?
I recognized one signatory on the list of questions. She was a Democrat. Were all the signatories Democrats?